-
April 2nd, 2005, 09:53 PM
#481
Inactive Member
-
April 2nd, 2005, 11:51 PM
#482
HB Forum Owner
IRT Melfice: GIF (heh! now a completely free graphics format, as the LZW patent Unisys held on it expired)
This is bad news, now we won't get the help of Linux nazis to get rid of GIF.
WRT JPEG2000: JPEG2000 is the five-year-old future of lossy graphics compression that, due to lack of education, carelessness, Adobe's stupidity, and other reasons, hasn't been adopted yet as the new lossy web graphics standard. Here's a demonstration. With JPEG2000 you can get x times smaller and therefore faster images and websites, where x is a pretty large number.
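To see why a wavelet codec compresses smooth images so well, here's a minimal pure-Python sketch of one level of the Haar wavelet transform. (This is only the principle; JPEG2000 actually uses the more elaborate 5/3 and 9/7 wavelet filters, and the function names here are made up for illustration.) Smooth data turns into averages plus near-zero detail coefficients, which quantize down to almost nothing:

```python
def haar_1d(signal):
    """One level of the (unnormalized) Haar wavelet transform:
    pairwise averages (low-pass) and differences (high-pass).
    Expects an even-length sequence."""
    averages = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    details = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return averages, details

def inverse_haar_1d(averages, details):
    """Exact reconstruction from averages and details."""
    out = []
    for a, d in zip(averages, details):
        out += [a + d, a - d]
    return out

# A smooth "scanline" of pixel values: the detail coefficients come out
# tiny and identical, so a codec can encode them almost for free.
row = [10.0, 10.5, 11.0, 11.5, 12.0, 12.5, 13.0, 13.5]
avg, det = haar_1d(row)
print(det)                               # every detail is -0.25
print(inverse_haar_1d(avg, det) == row)  # True: reconstruction is exact
```

A DCT-based codec like classic JPEG works on fixed 8x8 blocks instead, which is where its blocky artifacts come from at low quality; the wavelet approach degrades much more gracefully.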
-
April 3rd, 2005, 04:45 AM
#483
Inactive Member
Yes! I managed to fix my Dreamcast so it doesn't restart on me randomly when I play 3D games. If anyone else has this problem I will point you to a link with the tutorial. This means I can play Grandia II freely again.
OCB gained an engineering skill lvl.
-
April 3rd, 2005, 04:53 AM
#484
Senior Hostboard Member
too much wow OCB, too much wow. Btw, congratz
-
April 4th, 2005, 04:00 AM
#485
Inactive Member
I have been thinking about upgrading my GeForce FX 5700 to a GeForce 6600GT. I have been doing research on the 6600GT, and I noticed that the PCIe versions support Nvidia SLI, meaning that I can have two 6600GTs, two 6800GTs, or two 6800 Ultras (no mixing and matching) in one machine. I have also noticed that only AMD motherboards have the SLI technology. I was wondering if AMD motherboards are the only ones to support SLI, or if there are Intel motherboards that support SLI as well. For those who do not know, SLI is only compatible with the PCIe versions and not the AGP ones.
-
April 4th, 2005, 12:14 PM
#486
Inactive Member
Interesting comparison. And how is it that this wonder of science and data compression is still not a standard?
-
April 4th, 2005, 12:31 PM
#487
HB Forum Owner
The 6600GT is a good choice. I wouldn't bother with SLI though (if you want more power, just get a more expensive card, which will still cost less than 2x 6600GTs), and if you don't have PCI Express support, don't change your motherboard just for that (at the moment, not even AGP 4X is properly exploited; even the most texture-demanding game running out of VRAM won't demand more than AGP can provide).
There's no limitation preventing chipsets for Intel processors from supporting SLI; if you can't find any, you should start seeing them very soon. For example, I'm sure nVidia's new chipset for Pentium 4s will support SLI.
If you're deciding between Intel and AMD, it's again a tough decision, but I'd personally go for AMD's 64-bit processors, which will come in handy in the future and provide a better architecture than Intel's. Intel is kind of stuck with the same kind of core and ISA and doesn't seem to be rocking the scene as they used to in the Pentium 1 and 2 days. I haven't verified this personally and I'd look for actual data, but I've heard that if you compare equally priced processors from AMD and Intel, Intel's will do better in scientific applications and AMD's will do better in games. Since today's most demanding applications are games, I would opt for AMD.
I hope this helps.
-
April 6th, 2005, 04:26 AM
#488
Inactive Member
With proper driver support for SLI, do the cons outweigh the pros?
-
April 6th, 2005, 12:09 PM
#489
HB Forum Owner
IRT Melfice: how is it that this wonder of science and data compression is still not a standard?
Sorry, I hadn't seen your post. Several reasons...
1. Part of the standard has some kind of legal shit around it, but it was done in a way that doesn't prevent its free implementation and adoption. Even so, some people are reluctant to introduce it, never mind the fact that they use crappy GIFs all day and those were patented till a few weeks ago.
2. Adobe sucks; Adobe is responsible for PNGs not being as widespread as they should be, since their PNG compressor in Photoshop sucks ass and produces huge files. The JPEG2000 plugin is not even installed by default - Adobe is only worried about their proprietary, useless PDFs and other stupid technologies.
3. Internet Explorer. Because people are uninformed or ignorant and keep using Internet Explorer, 80% of web browsers are still MSIE. Until the situation changes, nothing Mozilla adopts will become a widespread standard without support for it in MSIE.
4. The Mozilla Foundation failed to adopt and advertise the format too.
5. Everything related to open standards regulated by committees like the W3C takes foreeeeeever; it's worse than religion. Keep in mind we don't even have a widely supported vector image standard yet! This is rather pathetic.
IRT Zeus: I think so. The issues I have with SLI are:
- You are forced to get another card of the same model.
- The total cost of ownership is probably higher than the cost of a single, equally performing video card. (This gets even worse if you already have a suitable motherboard and you're upgrading to a PCI-Express-enabled one just for this. PCI-Express is nowhere near necessary for today's and tomorrow's games by itself, so it would be an unnecessary upgrade if it weren't for SLI.)
- Two 6600s will provide far more power than you need for today's and tomorrow's games (a single 6600 would do fine), and in a few years, when games exploit all they can do, you may find them lacking features - a new shader model, for example. No one knows what they will do in the future, and SLI means sticking to a particular model for longer, as your next upgrade will only add raw power.
- More power consumption and heat dissipation. You'd need a more powerful power supply and a better-ventilated PC case than with a single graphics card.
I'm not saying two 6600s would be a bad thing to have, but I think there are other options that are a little better, like a single 6600 or 6800 depending on your budget. I got a 6800 ATX and I'm more than happy with it. I've tried it with the latest games, and I could verify it was never the bottleneck - not even for FarCry at maximum detail with HDR rendering enabled did the graphics card show any slowdown; it ran at exactly the same framerate as without any special effects. Two 6800s would be overdoing it; I'd rather wait three years and get a new card with new features by then.
In any case, I'll drop this link - this is a very serious, professional website made by people who know what they're doing (I can confirm this every time I read one of their articles), nothing like PC magazines. This is where I get the information I need to decide on hardware upgrades:
Tom's Hardware Guide
<font color="#345E81" size="1">[ April 06, 2005 09:11 AM: Message edited by: -Wiseman- ]</font>
-
April 6th, 2005, 11:40 PM
#490
HB Forum Owner
IRT Melfice: Vertex Shaders or Pixel Shaders. Not brilliant names, I know.
Traditional graphics accelerators rendered textures in a fixed way, supporting a fixed set of features like environment mapping, bump mapping, Gouraud shading, etc. Third-generation graphics accelerators like the nVidia GeForce introduced hardware Transform and Lighting (T&L), which allowed developers to offload a good part of the geometric processing from the CPU to the GPU.
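To make the T&L idea concrete, here's a minimal pure-Python sketch (the helper names are made up for illustration, not any real API) of the two per-vertex jobs a T&L unit performs for every vertex of every frame instead of the CPU: a 4x4 matrix transform and a simple Lambertian lighting term:

```python
def transform_vertex(matrix, vertex):
    """Apply a 4x4 transform matrix (row-major nested lists) to an
    (x, y, z) vertex, using homogeneous coordinates with w = 1."""
    x, y, z = vertex
    v = (x, y, z, 1.0)
    return tuple(sum(matrix[r][c] * v[c] for c in range(4)) for r in range(3))

def diffuse_intensity(normal, light_dir):
    """Lambertian (N dot L) lighting term, clamped at zero;
    both vectors assumed unit length."""
    return max(0.0, sum(n * l for n, l in zip(normal, light_dir)))

# Translate a vertex by (5, 0, 0) - one of the thousands of identical
# per-vertex operations a scene needs each frame.
translate = [[1, 0, 0, 5],
             [0, 1, 0, 0],
             [0, 0, 1, 0],
             [0, 0, 0, 1]]
print(transform_vertex(translate, (1.0, 2.0, 3.0)))   # (6.0, 2.0, 3.0)
print(diffuse_intensity((0.0, 1.0, 0.0), (0.0, 1.0, 0.0)))  # 1.0: facing the light
```

Nothing here is hard, but doing it for hundreds of thousands of vertices per frame is exactly the kind of repetitive arithmetic a dedicated hardware unit crushes and a general-purpose CPU shouldn't be wasting cycles on.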
Next-generation graphics processors introduced two new features: Vertex Shaders, the ability to customize the geometry processor with your own microcode, and Pixel Shaders, the ability to customize the rendering pipeline with your own microcode. You write in a highly specialized assembly language for the GPU pipelines which, before the GeForce 6xxx series, had heavy limitations on the number of instructions a shader could contain, but the results are certainly amazing. Now you can make vegetation move by itself, write your own shading models (Phong shading, metal shading, cel-shading...), and your own texture effects (diffuse shadows, refraction, recursive detail mapping, texture composition...). You can do really amazing things with this. Graphics accelerators no longer work in a fixed way; you decide how they should operate.
Vertex Shaders provide true, smooth acceleration for as much as your graphics card can handle; 3D APIs will take care of them in software if you don't have hardware support. Pixel Shaders aren't easily done in software though, and they are more easily perceived: they can do things you could never do with older graphics accelerators. First-generation shaders were too limited and graphics cards were too slow with them, so they weren't much use except for a special effect here and there. Current-generation cards (Radeon 9800, etc.) are much more powerful and you can rely on them a lot more. Games are starting to implement custom materials using custom shading models and custom special effects. Ever wondered why real-life gray plastic looks so much like plastic, and gray metal so much like metal, while in regular realtime 3D graphics it's hard to tell the difference? Now you can show it.
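That plastic-versus-metal difference comes down to how the specular highlight is colored, and it's exactly the kind of per-pixel decision a pixel shader makes. Here's a toy sketch of it in pure Python (run on the CPU for clarity; the function and its parameters are made up for illustration, and real shaders run on the GPU per fragment): plastic gets a white highlight, metal tints the highlight with the surface color.

```python
def shade_pixel(base_color, normal, light_dir, view_dir, metallic):
    """Toy per-pixel shading with a Blinn-style specular term.
    All direction vectors are assumed unit length; colors are RGB in 0..1.
    metallic=False -> white highlight (plastic look);
    metallic=True  -> highlight tinted by base_color (metal look)."""
    n_dot_l = max(0.0, sum(n * l for n, l in zip(normal, light_dir)))
    # Half vector between the light and view directions
    half = [(l + v) / 2 for l, v in zip(light_dir, view_dir)]
    length = sum(h * h for h in half) ** 0.5 or 1.0
    half = [h / length for h in half]
    spec = max(0.0, sum(n * h for n, h in zip(normal, half))) ** 8
    spec_color = base_color if metallic else (1.0, 1.0, 1.0)
    return tuple(min(1.0, c * n_dot_l + s * spec)
                 for c, s in zip(base_color, spec_color))

surface = (0.8, 0.6, 0.2)       # a yellowish material
n = (0.0, 0.0, 1.0)             # surface normal facing the viewer
light = (0.6, 0.0, 0.8)         # light off to the side
view = (0.0, 0.0, 1.0)          # eye straight ahead

plastic = shade_pixel(surface, n, light, view, metallic=False)
metal = shade_pixel(surface, n, light, view, metallic=True)
print(plastic)   # highlight washes toward white
print(metal)     # highlight keeps the surface's yellowish hue
```

Fixed-function hardware baked one such formula into silicon for everything on screen; a programmable pixel shader lets each material run its own version of this math, which is why newer games can finally make metal read as metal.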
Next-generation graphics accelerators, namely ATI's Radeon Xxxx series and nVidia's GeForce 6xxx series, are the first graphics cards to provide strong, robust support for heavy, long, resource-hungry shaders (with fewer limitations on the number of textures you may address per pixel, length of code, etc.), and are able to handle complex pixel shaders for every material (especially the high-end X800 and 6800 series). With this, you can expect games to get as good as what you can see in 3DMark 2003 - and even what you can see in 3DMark 2005, with a very fast processor.
New games like FarCry are bliss. Take a look at this screenshot I took myself, for example:
http://web.madritel.es/personales1/c...FarCry0002.jpg
I have a GeForce 6800 card with a 3+ year old Athlon XP 1800+ processor. The game is set to the maximum possible detail, and so are the Direct3D settings. In this pic you can appreciate:
- Unholy polygon count
- High resolution textures
- Diffuse shadows (palm trees)
- Specular highlighting (machine gun)
- ALL that vegetation is constantly, smoothly animated with Vertex Shaders (though you can't see this ;P).
- And most importantly, an amazing Pixel Shader used for the water, supporting realistic deformed and decolored/reilluminated reflection, realistic deformed and foggy refraction, several layers of waves, decals, fog, and other incredible features - and of course, it's all mip-mapped and anisotropically filtered. It's a long shader, and slower graphics cards slow down when there's too much of the water surface visible on screen (you have 5 lower settings for it so you can make it faster).
- Too bad I forgot to enable the spectacular High Dynamic Range rendering effect, or the cel-shading post-rendering effect, which doesn't look good but is even more taxing for the graphics card (only new-generation graphics cards can really handle that in FarCry).
That part ran fairly smoothly on my machine. It slows down when there are too many active objects (mostly foes) on screen, because the bottleneck is always the processor, even when you're looking directly at the water. Enabling all effects plus the most complex post-rendering Pixel Shader effect has no impact on framerate vs. the absolute lowest detail. What's funnier is my graphics card didn't even sweat - the temperature after a 1-hour session of those graphics was exactly 50°C, the same temperature it has when I'm sitting at the desktop doing nothing. It's a really awesome card; I couldn't be happier with it.
Here you have another example:
http://web.madritel.es/personales1/c...FarCry0003.jpg
This is an interior. You can see how metallic the floor looks around that light, and how realistic that shadowed smoke is - but you'd have to see it in motion! Besides this, you can see all objects cast shadows.
As if this weren't enough, the game engine is simply brilliant. Shoot the ropes of a bridge and see how the enemy falls. Push objects like barrels - make them roll, push other objects, hit trees, etc. Throw them into the water and see if they float. Drive cars, boats, and even hang gliders. Spy on what the AI does from a distance - how they talk to each other, and inform others if they see you or somebody is missing.
Google for more awesome FarCry (and "Far Cry") pics, and search for people with 6800s and X800s for more shocking screenshots. Here are a few from the first Google Images page:
http://www.ownt.com/graphics/gallery/f/farcry/03.jpg
http://mach5.freeshell.org/images/ba...arcry-orig.jpg
http://reviews.zdnet.co.uk/i/z/rv/20...ry-300x262.jpg
http://img75.exs.cx/img75/2465/FarCr...-14-09-58-.jpg
http://farcry.gamesweb.com/farcry/im.../far_cry20.jpg
http://farcry.gamesweb.com/farcry/im.../far_cry04.jpg
What do you think?
<font color="#345E81" size="1">[ April 07, 2005 10:12 AM: Message edited by: -Wiseman- ]</font>